Generalized Relative AG Divergence of Type S and Information Inequalities



Similar articles

Relative information of type s, Csiszár's f-divergence, and information inequalities

In recent years, Dragomir has contributed a substantial body of work providing different kinds of bounds on distance, information, and divergence measures. In this paper we unify some of his results using the relative information of type s and relate it to Csiszár's f-divergence.
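For context, the two quantities named in this abstract have standard forms in the literature; the following is an illustrative sketch (the notation $P=(p_1,\dots,p_n)$, $Q=(q_1,\dots,q_n)$, $C_f$ and $K_s$ is ours, and the paper's normalization may differ). For a convex function $f$ with $f(1)=0$, Csiszár's f-divergence is

\[
C_f(P\|Q)=\sum_{i=1}^{n} q_i\, f\!\left(\frac{p_i}{q_i}\right),
\]

and the relative information of type $s$ is

\[
K_s(P\|Q)=
\begin{cases}
\dfrac{1}{s(s-1)}\left[\displaystyle\sum_{i=1}^{n} p_i^{s}\, q_i^{1-s}-1\right], & s\neq 0,1,\\[2ex]
\displaystyle\sum_{i=1}^{n} q_i \ln\dfrac{q_i}{p_i}, & s=0,\\[2ex]
\displaystyle\sum_{i=1}^{n} p_i \ln\dfrac{p_i}{q_i}, & s=1,
\end{cases}
\]

where the $s=0$ and $s=1$ cases arise as limits and recover the reverse and forward Kullback-Leibler relative information.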


Relative Divergence Measures and Information Inequalities

Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are Kullback-Leibler's [17] relative information and Jeffreys' [16] J-divergence. The information radius, or Jensen difference divergence measure, due to Sibson [23] and Burbea and Rao [3, 4], has also found applications in the literature. Taneja [25] studied another...


Generalized Relative Information and Information Inequalities

In this paper, we obtain bounds on Csiszár's f-divergence in terms of the relative information of type s using Dragomir's [9] approach. In particular, the results lead to bounds in terms of the χ²-divergence, Kullback-Leibler's relative information, and Hellinger's discrimination.
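The three particular measures mentioned here are standard; under the illustrative notation sketched above they are commonly written as

\[
\chi^2(P\|Q)=\sum_{i=1}^{n}\frac{(p_i-q_i)^2}{q_i},\qquad
K(P\|Q)=\sum_{i=1}^{n} p_i\ln\frac{p_i}{q_i},\qquad
h(P\|Q)=\frac{1}{2}\sum_{i=1}^{n}\left(\sqrt{p_i}-\sqrt{q_i}\right)^{2},
\]

and, up to constant factors, they are the $s=2$, $s=1$ and $s=\tfrac{1}{2}$ cases of the relative information of type $s$ ($K_2=\tfrac{1}{2}\chi^2$, $K_1=K$, $K_{1/2}=4h$); exact normalizations vary across papers.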


On the Integral Representations of Generalized Relative Type and Generalized Relative Weak Type of Entire Functions

In this paper we establish integral representations of the generalized relative type and generalized relative weak type introduced by Datta et al. [9]. We also investigate their equivalence under certain conditions.


Generalized Symmetric Divergence Measures and Inequalities

The first measure generalizes the well-known J-divergence due to Jeffreys [16] and Kullback and Leibler [17]. The second measure gives a unified generalization of the Jensen-Shannon divergence due to Sibson [22] and Burbea and Rao [2, 3], and the arithmetic-geometric mean divergence due to Taneja [27]. These two measures contain in particular some well-known divergences such as Hellinger's discrimination...
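As a point of reference, the classical forms of the divergences being generalized are the following (standard definitions; the paper's type-s measures reduce to these at particular values of s, and the notation $J$, $I$, $T$ is illustrative):

\[
J(P\|Q)=\sum_{i=1}^{n}(p_i-q_i)\ln\frac{p_i}{q_i},\qquad
I(P\|Q)=\frac{1}{2}\left[\sum_{i=1}^{n} p_i\ln\frac{2p_i}{p_i+q_i}+\sum_{i=1}^{n} q_i\ln\frac{2q_i}{p_i+q_i}\right],
\]

\[
T(P\|Q)=\sum_{i=1}^{n}\frac{p_i+q_i}{2}\,\ln\frac{p_i+q_i}{2\sqrt{p_i\, q_i}},
\]

i.e. the J-divergence (Jeffreys; Kullback and Leibler), the Jensen-Shannon divergence (Sibson; Burbea and Rao), and the arithmetic-geometric (AG) mean divergence (Taneja), respectively.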



Journal

Journal title: IOSR Journal of Mathematics

Year: 2012

ISSN: 2319-765X, 2278-5728

DOI: 10.9790/5728-0432231